Efficient feature selection using shrinkage estimators
Authors
Abstract
Similar Resources
Model Selection, Covariance Selection and Bayes Classification via Shrinkage Estimators
Feature selection with missing data using mutual information estimators
Feature selection is an important preprocessing task for many machine learning and pattern recognition applications, including regression and classification. Missing data are encountered in many real-world problems and have to be considered in practice. This paper addresses the problem of feature selection in prediction problems where some occurrences of features are missing. To this end, the w...
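As a rough illustration of the setting the abstract describes (not the paper's own estimator, which is truncated here), one simple strategy is to score each feature by its mutual information with the target, computed only over the rows where that feature is observed. The function names and the listwise-per-feature handling of missing entries below are assumptions for the sketch:

```python
import numpy as np

def mutual_information(x, y):
    # Discrete MI estimate from joint frequency counts.
    xs, x_idx = np.unique(x, return_inverse=True)
    ys, y_idx = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xs), len(ys)))
    for i, j in zip(x_idx, y_idx):
        joint[i, j] += 1
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())

def rank_features_with_missing(X, y):
    # Score each feature by MI with the target, using only the rows
    # where that feature is observed (NaN marks a missing entry).
    scores = []
    for j in range(X.shape[1]):
        mask = ~np.isnan(X[:, j])
        scores.append(mutual_information(X[mask, j], y[mask]))
    return np.argsort(scores)[::-1]  # most informative feature first
```

Per-feature masking uses every observed value of each feature, at the cost of estimating each score from a different subsample.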
Kernel Mean Shrinkage Estimators
A mean function in a reproducing kernel Hilbert space (RKHS), or a kernel mean, is central to kernel methods in that it is used by many classical algorithms such as kernel principal component analysis, and it also forms the core inference step of modern kernel methods that rely on embedding probability distributions in RKHSs. Given a finite sample, an empirical average has been used commonly as...
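To make the object concrete: the empirical kernel mean evaluates as an average of kernel values against the sample, and a shrinkage estimator pulls that average toward a fixed target. The toy shrink-toward-zero rule below is only one simple member of the family the abstract alludes to, and the function names are invented for this sketch:

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    # Gaussian RBF kernel k(x, y) = exp(-gamma * ||x - y||^2).
    return np.exp(-gamma * np.sum((x - y) ** 2))

def empirical_kernel_mean(sample, query, gamma=1.0):
    # Standard estimator: mu_hat(query) = (1/n) * sum_i k(x_i, query).
    return float(np.mean([rbf(x, query, gamma) for x in sample]))

def shrunk_kernel_mean(sample, query, alpha=0.1, gamma=1.0):
    # Simple shrinkage toward zero: (1 - alpha) * mu_hat(query).
    # Choosing alpha (e.g. to minimize estimated risk) is the
    # substance of the paper and is not shown here.
    return (1.0 - alpha) * empirical_kernel_mean(sample, query, gamma)
```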
Efficient Multi-Label Feature Selection Using Entropy-Based Label Selection
Abstract: Multi-label feature selection is designed to select a subset of features according to their importance to multiple labels. This task can be achieved by ranking the dependencies of features and selecting the features with the highest rankings. In a multi-label feature selection problem, the algorithm may be faced with a dataset containing a large number of labels. Because the computati...
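One way to picture the label-selection step: rank the label columns by entropy and keep only the most informative ones before running the (more expensive) feature-ranking stage. This is a minimal sketch under that assumption; the actual criterion and algorithm are the paper's, and the function names here are hypothetical:

```python
import numpy as np

def binary_entropy(p):
    # Shannon entropy (in nats) of a Bernoulli(p) label.
    if p in (0.0, 1.0):
        return 0.0
    return float(-p * np.log(p) - (1 - p) * np.log(1 - p))

def select_informative_labels(Y, k):
    # Rank each column of the binary multi-label matrix Y by entropy
    # and keep the k highest-entropy (least predictable) labels.
    entropies = [binary_entropy(col.mean()) for col in Y.T]
    return np.argsort(entropies)[::-1][:k]
```

Near-constant labels carry little information for ranking features, so dropping them shrinks the label set the feature scores must be computed against.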
Pass-efficient unsupervised feature selection
The goal of unsupervised feature selection is to identify a small number of important features that can represent the data. We propose a new algorithm, a modification of the classical pivoted QR algorithm of Businger and Golub, that requires a small number of passes over the data. The improvements are based on two ideas: keeping track of multiple features in each pass, and skipping calculations...
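The in-memory baseline being improved on, classical column-pivoted QR, can be sketched as a greedy loop: repeatedly pick the feature (column) with the largest residual norm, then project it out of the remaining columns. This shows only the Businger–Golub-style pivoting idea, not the paper's pass-efficient variant; the function name is invented for the sketch:

```python
import numpy as np

def greedy_pivot_selection(X, k):
    # Greedy column pivoting (the core of pivoted QR): at each step,
    # choose the column with the largest norm in the residual matrix,
    # then deflate the residual by that column's direction.
    R = X.astype(float).copy()
    chosen = []
    for _ in range(k):
        norms = np.linalg.norm(R, axis=0)
        norms[chosen] = -1.0             # never re-pick a column
        j = int(np.argmax(norms))
        chosen.append(j)
        q = R[:, j] / np.linalg.norm(R[:, j])
        R -= np.outer(q, q @ R)          # remove q's component everywhere
    return chosen
```

Each step needs access to the full residual matrix, which is exactly the cost that a pass-efficient method, tracking several candidate columns per pass and skipping redundant updates, tries to avoid.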
Journal
Journal title: Machine Learning
Year: 2019
ISSN: 0885-6125, 1573-0565
DOI: 10.1007/s10994-019-05795-1